Sparse nonnegative tensor decomposition using proximal algorithm and inexact block coordinate descent scheme
Authors
Abstract
Nonnegative tensor decomposition is a versatile tool for multiway data analysis, by which the extracted components are nonnegative and usually sparse. Nevertheless, the sparsity is only a side effect and cannot be explicitly controlled without additional regularization. In this paper, we investigated the nonnegative CANDECOMP/PARAFAC (NCP) decomposition with a sparse regularization term using the $$l_1$$-norm (sparse NCP). When high sparsity is imposed, the factor matrices will contain more zero entries and will not be of full column rank. Thus, the sparse NCP is prone to rank deficiency, and its algorithms may fail to converge. We proposed a novel model of sparse NCP with the proximal algorithm. The subproblems in the new model are strongly convex in the block coordinate descent (BCD) framework. Therefore, the new model provides a full column rank condition and guarantees convergence to a stationary point. In addition, we proposed an inexact BCD scheme for sparse NCP, where each subproblem is updated multiple times to speed up the computation. In order to prove the effectiveness and efficiency of the proposed algorithm, we employed two optimization methods to solve the model, including alternating nonnegative quadratic programming and hierarchical alternating least squares. We evaluated the proposed methods by experiments on synthetic and real-world, small-scale and large-scale tensor data. The experimental results demonstrate that our algorithms can efficiently impose sparsity on factor matrices, extract meaningful sparse components, and outperform state-of-the-art methods.
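As a rough illustration of the model described above, the sketch below runs one inexact BCD sweep for a 3-way sparse NCP in which each block subproblem carries an $$l_1$$ penalty and a proximal term. The column-wise (HALS-type) inner update, the helpers `unfold` and `khatri_rao`, and the parameters `beta`, `tau`, and `inner_iters` are illustrative assumptions, not the authors' implementation.

```python
# Sketch: one inexact BCD sweep for 3-way sparse NCP with l1 penalty and proximal term.
# Model (assumed form): min 0.5*||X - [[A, B, C]]||_F^2 + beta*(||A||_1 + ||B||_1 + ||C||_1),
# A, B, C >= 0, with (tau/2)*||F - F_prev||_F^2 added to each block subproblem.
import numpy as np


def unfold(X, mode):
    """Mode-n unfolding: move the chosen mode to the front and flatten the rest."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)


def khatri_rao(U, V):
    """Column-wise Khatri-Rao product of U (J x R) and V (K x R) -> (J*K) x R."""
    R = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, R)


def prox_hals_update(F, F_prev, W, G, beta, tau, inner_iters=3):
    """Inexact update of one factor: a few HALS-type passes over its columns.

    W = X_(n) @ K and G = K.T @ K for the Khatri-Rao product K of the other factors.
    """
    R = F.shape[1]
    for _ in range(inner_iters):          # inexact BCD: several cheap inner updates
        for r in range(R):
            numer = (W[:, r] - F @ G[:, r] + G[r, r] * F[:, r]
                     - beta + tau * F_prev[:, r])
            F[:, r] = np.maximum(0.0, numer / (G[r, r] + tau))
    return F


def sparse_ncp_sweep(X, factors, beta=0.1, tau=1e-3, inner_iters=3):
    """One outer BCD sweep over the three factor matrices of a 3-way tensor X."""
    modes = [(0, (1, 2)), (1, (0, 2)), (2, (0, 1))]
    for n, (p, q) in modes:
        K = khatri_rao(factors[p], factors[q])
        W = unfold(X, n) @ K
        G = K.T @ K
        factors[n] = prox_hals_update(factors[n], factors[n].copy(),
                                      W, G, beta, tau, inner_iters)
    return factors
```

Because `tau > 0`, every column update divides by `G[r, r] + tau > 0`, which mirrors the role of the proximal term in the abstract: the block subproblems stay strongly convex even when a heavily sparsified factor matrix loses full column rank.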
Similar resources
A Fast Algorithm for Nonnegative Tensor Factorization using Block Coordinate Descent and an Active-set-type method
Nonnegative factorization of tensors plays an important role in the analysis of multi-dimensional data in which each element is inherently nonnegative. It provides a meaningful lower rank approximation, which can further be used for dimensionality reduction, data compression, text mining, or visualization. In this paper, we propose a fast algorithm for nonnegative tensor factorization (NTF) bas...
Inexact block coordinate descent methods with application to the nonnegative matrix factorization
This work is concerned with the cyclic block coordinate descent method, or nonlinear Gauss-Seidel method, where the solution of an optimization problem is achieved by partitioning the variables in blocks and successively minimizing with respect to each block. The properties of the objective function that guarantee the convergence of such alternating scheme have been widely investigated in the l...
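To make the alternating scheme concrete, here is a minimal sketch, under assumed names and step sizes, of an inexact two-block coordinate descent for NMF in which each block is improved by a few projected-gradient steps instead of being minimized exactly; it is an illustration of the idea, not the method analyzed in that work.

```python
# Inexact two-block coordinate descent for min_{W,H>=0} 0.5*||V - W H||_F^2 (sketch).
import numpy as np


def pg_step(F, grad, lipschitz):
    """One projected-gradient step with step size 1/L, clipped to the nonnegative orthant."""
    return np.maximum(0.0, F - grad / lipschitz)


def inexact_bcd_nmf(V, rank, n_sweeps=100, inner_iters=5, seed=0):
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_sweeps):
        # Block 1: inexact update of W with H fixed.
        HHt = H @ H.T
        L_w = np.linalg.norm(HHt, 2) + 1e-12      # Lipschitz constant of the W-gradient
        VHt = V @ H.T
        for _ in range(inner_iters):
            W = pg_step(W, W @ HHt - VHt, L_w)
        # Block 2: inexact update of H with W fixed.
        WtW = W.T @ W
        L_h = np.linalg.norm(WtW, 2) + 1e-12
        WtV = W.T @ V
        for _ in range(inner_iters):
            H = pg_step(H, WtW @ H - WtV, L_h)
    return W, H
```

Raising `inner_iters` trades more work per sweep for fewer sweeps, which is the usual motivation for inexact inner solves.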
Sparse non-negative tensor factorization using columnwise coordinate descent
Many applications in computer vision, biomedical informatics, and graphics deal with data in the matrix or tensor form. Non-negative matrix and tensor factorization, which extract data-dependent non-negative basis functions, have been commonly applied for the analysis of such data for data compression, visualization, and detection of hidden information (factors). In this paper, we present a fas...
Algorithms for nonnegative matrix and tensor factorizations: a unified view based on block coordinate descent framework
We review algorithms developed for nonnegative matrix factorization (NMF) and nonnegative tensor factorization (NTF) from a unified view based on the block coordinate descent (BCD) framework. NMF and NTF are low-rank approximation methods for matrices and tensors in which the low-rank factors are constrained to have only nonnegative elements. The nonnegativity constraints have been shown...
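For contrast with the inexact scheme above, here is a hedged sketch of the exact two-block (ANLS-style) instance of the BCD framework for NMF, where each factor is recovered by solving a nonnegative least squares problem column by column with SciPy's `nnls`; the function names and loop structure are assumptions for illustration, not code from the review.

```python
# Two-block BCD (ANLS) for NMF: each block solved exactly via nonnegative least squares.
import numpy as np
from scipy.optimize import nnls


def nnls_columns(A, B):
    """Solve min_{X>=0} ||A X - B||_F column by column."""
    return np.column_stack([nnls(A, B[:, j])[0] for j in range(B.shape[1])])


def anls_nmf(V, rank, n_sweeps=30, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], rank))
    for _ in range(n_sweeps):
        H = nnls_columns(W, V)          # block 1: H with W fixed
        W = nnls_columns(H.T, V.T).T    # block 2: W with H fixed (transposed problem)
    return W, H
```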
Block Coordinate Descent for Sparse NMF
Nonnegative matrix factorization (NMF) has become a ubiquitous tool for data analysis. An important variant is the sparse NMF problem which arises when we explicitly require the learnt features to be sparse. A natural measure of sparsity is the L0 norm, however its optimization is NP-hard. Mixed norms, such as L1/L2 measure, have been shown to model sparsity robustly, based on intuitive attribu...
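As one concrete example of an L1/L2 sparsity measure of the kind mentioned above, the snippet below computes Hoyer's sparseness, which maps a vector to a score in [0, 1] (0 for a constant-magnitude vector, 1 for a 1-sparse vector); whether this exact measure is the one used in that paper is not stated here, so treat it as an illustration.

```python
# Hoyer's sparseness: an L1/L2 mixed-norm sparsity measure in [0, 1].
import numpy as np


def hoyer_sparseness(x, eps=1e-12):
    x = np.asarray(x, dtype=float).ravel()
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.sqrt((x ** 2).sum()) + eps
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)


# Example: a dense constant vector scores ~0, a 1-sparse vector scores 1.
print(hoyer_sparseness([1, 1, 1, 1]))   # ~0.0
print(hoyer_sparseness([0, 0, 3, 0]))   # 1.0
```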
Journal
Journal title: Neural Computing and Applications
Year: 2021
ISSN: 0941-0643, 1433-3058
DOI: https://doi.org/10.1007/s00521-021-06325-8